
    Beyond Six Sigma – A Control Chart for Tracking Defects per Billion Opportunities (dpbo)

    In many processes, and in particular those related to electronics packaging and assembly, the number of possible defect counts per unit of product has become quite large. Because the classical attributes-based statistical process control (SPC) charts in which defects are measured in counts – the u chart and c chart, for example – are cumbersome to apply at such large scales of defect possibilities, a defects per million opportunities (dpmo) chart was developed in the mid-1990s. Not long after, a researcher at Packard Bell noted that “world-class” in surface mount technology (SMT) – an assembly technique for electronics manufacturing – should mean that a company is operating at 50 ppm (or fewer) defect levels, and suggested that the number could drop to 10 ppm. Furthermore, it was hypothesized that the electronics industry may one day refer to defect levels in terms of parts per billion (ppb) – defect levels reflective of process capabilities better than six sigma. With this in mind, this paper presents a new control chart for attributes data measured in counts in which the plot point per period is the number of defects per billion opportunities (dpbo). In addition to presenting the plot point and control chart calculations, an example is provided and analyzed to demonstrate the chart's use.
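    The abstract refers to plot point and control limit calculations without reproducing them. As a rough illustration only, the sketch below adapts the familiar u-chart calculations by treating one billion opportunities as the inspection unit; the function, variable names, and data are hypothetical, and this is not necessarily the chart construction proposed in the paper.

```python
"""Minimal sketch of a dpbo (defects per billion opportunities) chart,
assuming a u-chart-style treatment with 1e9 opportunities as the inspection unit."""
import math

BILLION = 1_000_000_000

def dpbo_chart(defects, opportunities):
    """defects[i], opportunities[i]: counts observed in period i."""
    # Plot point: defects per billion opportunities in each period.
    points = [d / o * BILLION for d, o in zip(defects, opportunities)]

    # Center line: overall defect rate expressed per billion opportunities.
    u_bar = sum(defects) / sum(opportunities) * BILLION

    limits = []
    for o in opportunities:
        n = o / BILLION                        # inspection units of 1e9 opportunities
        half_width = 3 * math.sqrt(u_bar / n)  # u-chart-style 3-sigma half width
        limits.append((max(0.0, u_bar - half_width), u_bar + half_width))
    return points, u_bar, limits

# Example with hypothetical assembly data for five periods.
defects = [3, 5, 2, 4, 6]
opportunities = [40_000_000, 55_000_000, 38_000_000, 47_000_000, 60_000_000]
pts, cl, lims = dpbo_chart(defects, opportunities)
for i, (p, (lcl, ucl)) in enumerate(zip(pts, lims), start=1):
    flag = "in control" if lcl <= p <= ucl else "out of control"
    print(f"period {i}: dpbo={p:,.0f}  LCL={lcl:,.0f}  UCL={ucl:,.0f}  ({flag})")
```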

    A Heuristic Based On Makespan Lower Bounds in Flow Shops with Multiple Processors

    Minimum makespan scheduling of Flow Shops with Multiple Processors (FSMPs), also known as the Hybrid Flow Shop (HFS), is classified as NP-complete. Solving FSMP makespan scheduling instances therefore depends largely on strong heuristics. An FSMP consists of m stages, each with one or more processors, through which n jobs are scheduled. This paper presents a heuristic based on the lower bound developed in a prior work to determine good makespan solutions in the FSMP environment. In the environment studied here, the multiple machines available at a particular processing stage are identical processors. To evaluate the proposed heuristic, its performance is compared with makespans obtained from modified pure flow shop heuristics. Results show that the proposed heuristic is indeed a strong heuristic for the FSMP and provides makespans that are better than those obtained from several existing pure flow shop heuristics adapted for the FSMP environment.
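    The lower bound the heuristic builds on comes from prior work and is not reproduced in the abstract. For orientation only, the sketch below computes a generic stage-based makespan lower bound for an FSMP with identical processors per stage; it is an assumption-laden stand-in, not the paper's bound or heuristic, and the instance data are invented.

```python
"""Minimal sketch of a stage-based makespan lower bound for a flow shop with
multiple identical processors per stage (generic textbook-style bound)."""

def fsmp_lower_bound(p, machines):
    """p[j][k]: processing time of job j at stage k; machines[k]: number of
    identical processors at stage k. Returns a makespan lower bound."""
    n, m = len(p), len(machines)

    # Job-based bound: no job can finish before its total processing time.
    lb = max(sum(job) for job in p)

    # Stage-based bound: total work at stage k shared by machines[k] processors,
    # plus the smallest unavoidable "head" (work before k) and "tail" (work after k).
    for k in range(m):
        head = min(sum(job[:k]) for job in p)
        tail = min(sum(job[k + 1:]) for job in p)
        stage_work = sum(job[k] for job in p) / machines[k]
        lb = max(lb, head + stage_work + tail)
    return lb

# Example: 4 jobs, 3 stages, with 2, 1 and 2 machines per stage (hypothetical).
p = [[4, 3, 2], [2, 5, 4], [3, 2, 3], [5, 4, 1]]
print(fsmp_lower_bound(p, machines=[2, 1, 2]))
```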

    Minimizing The Number of Tardy Jobs in Hybrid Flow Shops with Non-Identical Multiple Processors

    Two-stage hybrid flow shops (also known as flow shops with multiple processors, FSMPs) are studied in which the multiple processors at a stage are non-identical but related (i.e., uniform) in their processing speeds. The impact of ten different dispatching procedures on a due-date-based criterion (specifically, the number of tardy jobs) is analyzed over a set of 1,800 problems of varying configurations, in which the number of jobs per problem ranges from 20 to 100 and due dates are randomly assigned. Results indicate that the modified due date (MDD), earliest due date (EDD), slack (SLK), shortest processing time (SPT), and least work remaining (LWR) rules are statistically inseparable but yield superior performance to the other rules included in this study. The longest processing time (LPT) and most work remaining (MWR) rules provide the poorest performance.
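    The dispatching rules named above are standard ones; as a rough illustration, the sketch below expresses each rule as a priority key over a queue of waiting jobs (smaller key is dispatched first). The job data structure, the tie-breaking, and the handling of the uniform (related-speed) machines are illustrative assumptions, not details taken from the paper.

```python
"""Minimal sketch of the standard dispatching rules as priority keys."""

def priority(rule, job, now):
    """job: dict with 'due' (due date), 'proc' (processing time of the imminent
    operation) and 'remaining' (total work remaining over all stages)."""
    if rule == "EDD":   # earliest due date
        return job["due"]
    if rule == "MDD":   # modified due date
        return max(job["due"], now + job["remaining"])
    if rule == "SLK":   # minimum slack
        return job["due"] - now - job["remaining"]
    if rule == "SPT":   # shortest processing time (imminent operation)
        return job["proc"]
    if rule == "LWR":   # least work remaining
        return job["remaining"]
    if rule == "LPT":   # longest processing time
        return -job["proc"]
    if rule == "MWR":   # most work remaining
        return -job["remaining"]
    raise ValueError(rule)

# Example: choose the next job for a free machine at time now=10 (hypothetical data).
queue = [
    {"id": 1, "due": 25, "proc": 6, "remaining": 11},
    {"id": 2, "due": 18, "proc": 4, "remaining": 9},
    {"id": 3, "due": 30, "proc": 3, "remaining": 15},
]
for rule in ("MDD", "EDD", "SLK", "SPT", "LWR", "LPT", "MWR"):
    chosen = min(queue, key=lambda j: priority(rule, j, now=10))
    print(rule, "->", "job", chosen["id"])
```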

    LNG Facility Siting – An Alternative Approach for Vapor Cloud Reduction

    The siting of marine LNG facilities in the United States requires application of Title 49 of the Code of Federal Regulations (CFR), Part 193, Liquefied Natural Gas Facilities: Federal Safety Standards [1], and NFPA 59A – 2001 Edition, Standard for the Production, Storage, and Handling of Liquefied Natural Gas [2]. In addition, the LNG siting guidance available on the PHMSA (US DOT Pipeline and Hazardous Materials Safety Administration) website must also be followed. One of the most important items in LNG facility siting is the Hazards Analysis, which consists of identifying the Single Accidental Leakage Sources (SALSs) by analyzing all piping in the facility. The SALSs for conventional piping are defined based on the size and length of the lines and the application of a failure rate table provided by PHMSA [3]. This methodology may generate scenarios with large flammable gas clouds, especially for long lines such as the LNG loading line and rundown line. Vapor barriers are a design solution acceptable to the regulators and commonly used to prevent vapor clouds from reaching a property that could be built upon [4]. This paper presents the application of Pipe-in-Pipe (PiP) technology to the LNG rundown line and the LNG loading line (which runs over a marine trestle). The Pipe-in-Pipe consists of an inner pipe designed for the process conditions of the particular service, insulation material wrapping the inner pipe, and an outer pipe. The outer pipe is designed to provide full containment in the unlikely event of a leak from the inner pipe and to withstand any thermal deformation due to exposure to cryogenic temperatures. Any leakage from the inner pipe is directed to the flare system. With the application of PiP technology to the LNG loading line and rundown line, any potential leaks in the inner piping will be contained by the outer pipe and directed to a safe disposition. If approved by FERC and USCG, this technology will give proposed LNG projects the potential for reduced flammable gas clouds, reducing the need for other mitigations such as vapor barriers. Another advantage of PiP technology is that any leak in the marine area will be contained by the outer pipe, allowing a reduction of the liquid containment system for facilities with long trestles over water.

    Different cultures, different values: The role of cultural variation in public’s WTP for marine species conservation

    Get PDF
    Understanding cultural variation in public preferences for marine species is a necessary prerequisite if conservation objectives are to include societal preferences in addition to scientific considerations. We report the results of a contingent valuation study undertaken at three case-study sites: the Azores islands (Portugal), the Gulf of Gdansk (Poland) and the Isles of Scilly (UK). The study considered species richness of five marine taxa (mammals, birds, fish, invertebrates and algae) as proxies of marine biodiversity, and the aim of the analysis was to estimate, from a multi-site perspective, the public's willingness to pay (WTP) to avoid increased levels of species loss (reduction of species richness) for the different marine taxa. Results, based on 1502 face-to-face interviews, showed that income, education and environmental awareness of the respondents were significant predictors of WTP for marine species conservation. Results also indicated that respondents in each of the European locations had different preferences for marine taxa. In the Azores, although mammals and fish were valued highly, only small differences in WTP occurred among the taxa. Respondents in the Isles of Scilly put a relatively low value on fish, while algae and marine mammals were highly valued. In Gdansk, respondents expressed a clear order of preference: marine mammals > fish > birds > invertebrates and algae. These findings suggest that cultural differences may be important drivers of valuation and undermine the commonly held premise that charismatic/likeable taxa consistently have a disproportionately strong influence on WTP for biodiversity conservation. We conclude that conservation policy must take account of cultural diversity alongside biological diversity.

    Investigative Study on Preprint Journal Club as an Effective Method of Teaching Latest Knowledge in Astronomy

    As recent advancements in physics and astronomy rapidly rewrite textbooks, there is a growing need to keep abreast of the latest knowledge in these fields. Reading preprints is one effective way to do this, and journal clubs, where people read and discuss papers together, make the benefits of that reading more apparent. We present an investigative study of the factors that affect the success of preprint journal clubs in astronomy, more commonly known as Astro-ph/Astro-Coffee (hereafter called AC). A survey was disseminated to understand how institutions in different countries implement AC. We interviewed nine survey respondents and from their responses identified four important factors that make AC successful: commitment (how the organizer and attendees participate in AC), environment (how conducive and comfortable the setting in which AC is conducted is), content (the topics discussed in AC and how they are presented), and objective (the main goal(s) of conducting AC). We also present the format of our AC, an elective class that was evaluated during the Spring Semester 2020 (March 2020 - June 2020). Our evaluation with the attendees showed that enrollees (those who are enrolled and are required to present papers regularly) tend to be more committed to attending than audiences (those who are not enrolled and are not required to present papers regularly). In addition, participants tend to find papers outside their research field harder to read. Finally, we show an improvement in the weekly number of papers read after attending AC among those who present papers regularly, and a high satisfaction rating of our AC. We summarize the areas of improvement in our AC implementation and encourage other institutions to evaluate their own AC against the four aforementioned factors to assess its effectiveness in reaching their goals.

    Comment: Accepted for publication in PRPER. A summary video is available at http://www.youtube.com/watch?v=fzy2I_xA_dU&ab_channel=NthuCosmolog

    Can luminous Lyman alpha emitters at z \simeq 5.7 and z \simeq 6.6 suppress star formation?

    Addressing how strong UV radiation affects galaxy formation is central to understanding galaxy evolution. The quenching of star formation by strong UV radiation (from starbursts or AGN) has been proposed in various contexts to solve certain astrophysical problems. Around luminous sources, some evidence of decreased star formation has been found, but it is limited to a handful of individual cases, and no direct, conclusive evidence on the actual role of strong UV radiation in quenching star formation has been established. Here we present statistical evidence of a decreased number density of faint (AB magnitude \geq 24.75 mag) Ly\alpha emitters (LAEs) around bright (AB magnitude < 24.75 mag) LAEs out to a radius of 10 pMpc for z \simeq 5.7 LAEs. A similar trend is found for z \simeq 6.6 LAEs, but only within a 1 pMpc radius of the bright LAEs. We use a large sample of 1077 (962) LAEs at z \simeq 5.7 (z \simeq 6.6) selected in total areas of 14 (21) deg^2 with Subaru/Hyper Suprime-Cam narrow-band data; thus, the result is statistically significant for the first time at these high-redshift ranges. A simple analytical calculation indicates that the radiation from the central LAE is not enough to suppress LAEs with AB magnitude \geq 24.75 mag around it, suggesting that additional physical mechanisms we are unaware of are at work. Our results clearly show that environment plays a role in galaxy formation at z \sim 6 in the Universe.

    Comment: Accepted for publication in MNRAS